# Aozora Bunko pre-training

## Roberta Small Japanese Aozora
A small Japanese RoBERTa model pre-trained on Aozora Bunko texts, suitable for various downstream NLP tasks.
Tags: Large Language Model · Transformers · Japanese
Author: KoichiYasuoka · Downloads: 19 · Likes: 0
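As a minimal sketch of how such a masked language model might be loaded with the Hugging Face `transformers` library, the snippet below assumes the repository id `KoichiYasuoka/roberta-small-japanese-aozora` (inferred from the listing; verify on the author's Hugging Face page before use):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Assumed repo id based on the listing above; not confirmed by this page
model_id = "KoichiYasuoka/roberta-small-japanese-aozora"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask demo: build the prompt with the tokenizer's own mask token,
# since the exact mask symbol depends on the model's configuration
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
masked = f"日本に着いたら{tokenizer.mask_token}を訪問してください。"
print(fill(masked))
```

For downstream tasks, the same checkpoint would typically be fine-tuned with a task-specific head rather than used as-is.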
## Roberta Small Japanese Luw Upos
A RoBERTa model pre-trained on Aozora Bunko texts for Japanese POS tagging and dependency parsing.
Tags: Sequence Labeling · Transformers · Supports Multiple Languages
Author: KoichiYasuoka · Downloads: 1,545 · Likes: 0
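Since this checkpoint is listed as a sequence-labeling model, a plausible way to try it is through the token-classification pipeline. The sketch below assumes the repository id `KoichiYasuoka/roberta-small-japanese-luw-upos` and that its labels are UPOS part-of-speech tags; dependency parsing would require additional tooling not shown here.

```python
from transformers import pipeline

# Assumed repo id based on the listing above; not confirmed by this page
model_id = "KoichiYasuoka/roberta-small-japanese-luw-upos"

# Token-classification pipeline; "simple" aggregation merges subword pieces
tagger = pipeline("token-classification", model=model_id,
                  aggregation_strategy="simple")

# Print each recovered word with its predicted POS label
for token in tagger("国境の長いトンネルを抜けると雪国であった。"):
    print(token["word"], token["entity_group"])
```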